Quality assessment in crowdsourced classification tasks
Authors
Abstract
Similar resources
Quality Control for Crowdsourced Enumeration Tasks
Quality control is one of the central issues in crowdsourcing research. In this paper, we consider the quality control problem for crowdsourced enumeration tasks, which ask workers to enumerate as many possible answers as they can. Since workers do not necessarily provide correct answers, and do not provide exactly the same answers even when the answers express the same idea, we propose a two-stage qual...
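As a rough illustration of the general shape of such aggregation (not the two-stage method proposed in the paper above, whose details are truncated here), enumeration answers can be grouped so that surface variants of the same idea collapse together, and then only answers supported by several workers are kept. The normalization rule, the min_support threshold, and all names below are assumptions made for this sketch.

from collections import defaultdict

def normalize(answer: str) -> str:
    # Crude canonicalization so surface variants of the same idea collapse together.
    return " ".join(answer.lower().split())

def aggregate_enumeration(worker_answers: dict[str, list[str]], min_support: int = 2) -> list[str]:
    # Stage 1: group answers by their normalized form.
    # Stage 2: keep only answers supported by at least `min_support` distinct workers.
    support = defaultdict(set)
    for worker, answers in worker_answers.items():
        for answer in answers:
            support[normalize(answer)].add(worker)
    return [a for a, workers in support.items() if len(workers) >= min_support]

# Example: three workers enumerate "things found in a kitchen".
answers = {
    "w1": ["Fridge", "oven", "sink"],
    "w2": ["fridge", "Sink", "toaster"],
    "w3": ["sink", "oven"],
}
print(aggregate_enumeration(answers))  # ['fridge', 'oven', 'sink']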
Quality Assessment for Crowdsourced Object Annotations
As computer vision datasets grow larger, the community is increasingly relying on crowdsourced annotations to train and test our algorithms. Due to the heterogeneous and unpredictable capability of online annotators, various strategies have been proposed to “clean” crowdsourced annotations. However, these strategies typically involve getting more annotations, perhaps different types of annotatio...
Quality-control mechanism utilizing worker's confidence for crowdsourced tasks
We propose a quality control mechanism that utilizes workers’ self-reported confidences in crowdsourced labeling tasks. Generally, a worker has confidence in the correctness of her answers, and asking about it is useful for estimating the probability of correctness. However, we need to overcome two main obstacles in order to use confidence for inferring correct answers. First, a worker is no...
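As an illustration of the underlying idea only (the paper's actual mechanism additionally addresses the two obstacles it mentions), a naive confidence-weighted vote can be sketched in Python; the function name and the example labels and weights are assumptions for this sketch.

from collections import defaultdict

def weighted_vote(labels: list[tuple[str, float]]) -> str:
    # `labels` holds one (label, confidence) pair per worker, with confidence in [0, 1].
    # The label with the largest summed self-reported confidence wins.
    score = defaultdict(float)
    for label, confidence in labels:
        score[label] += confidence
    return max(score, key=score.get)

# Two confident workers say "cat", one hesitant worker says "dog".
print(weighted_vote([("cat", 0.9), ("cat", 0.8), ("dog", 0.4)]))  # cat

Taking self-reported confidence at face value like this is exactly what the obstacles mentioned in the abstract caution against; it is shown here only as the baseline the proposed mechanism improves upon.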
Reducing Error in Context-Sensitive Crowdsourced Tasks
The recent growth of global crowdsourcing platforms has enabled businesses to leverage the time and expertise of workers world-wide with low overhead and at low cost. In order to utilize such platforms, one must decompose work into tasks that can be distributed to crowd workers. To this end, platform vendors provide task interfaces at varying degrees of granularity, from short, simple microtask...
LingoTurk: managing crowdsourced tasks for psycholinguistics
LingoTurk is an open-source, freely available crowdsourcing client/server system aimed primarily at psycholinguistic experimentation where custom and specialized user interfaces are required but not supported by popular crowdsourcing task management platforms. LingoTurk enables user-friendly local hosting of experiments as well as condition management and participant exclusion. It is compatible...
Journal
Journal title: International Journal of Crowd Science
Year: 2019
ISSN: 2398-7294
DOI: 10.1108/ijcs-06-2019-0017